CASE #2: Human Factors Design

"Structurally sound aircraft plummet to the earth, ships run aground in calm seas, industrial machines run awry, and the instruments of medical science maim and kill unsuspecting patients, all because of incompatibilities between the way things are designed and the way people perceive, think, and act." Casey, 1993

What Is Human Error, and Why Does It Occur?

Historical Perspective

Human factors engineering was probably born in the early 1900s, when Frank and Lillian Gilbreth studied surgical operating teams and realized they could improve a surgeon's efficiency by instituting the procedure still used today, in which each instrument is handed to the surgeon as he or she calls for it and extends a hand to the nurse or technologist (see the video sample if you are not familiar with it). Most likely you have seen this dramatized in movies and on TV. Unfortunately, the idea of adapting machines for more efficient human use did not catch on.

During World War II it was demonstrated that, even with extensive training, error rates persisted. After the war, researchers established "engineering psychology laboratories," a field also known as "human factors," "human usability engineering," and "ergonomics." We will use the term "human factors," the most common term used in the United States. BUT still not enough interest was generated to prevent the next catastrophe.

In 1979 a cataclysmic event brought public awareness to the importance of "human factors" engineering. At the Three Mile Island nuclear power plant, a simple clog in the feedwater lines of Turbine No. 1 triggered a series of events that needlessly led to a partial nuclear meltdown. A detailed study of the Three Mile Island incident revealed that no single fault, mistake, event, or malfunction caused the situation; rather, responsibility was distributed across a number of sources. Human error was involved at several levels: incorrect decisions, the design of a relief valve system, the overwhelming complexity of the information presented to the human operators, and a confusing display format. Together these were enough to guarantee that the operators' ability to attend, perceive, remember, decide, and act would become overloaded.

(YOU ONLY NEED TO WATCH THE FIRST 4 MINUTES OF THIS VIDEO)


DON'T be quick to judge an individual player (the operators, the engineers, the company) as being at fault, or to assume there is a simple fix to the situation at Three Mile Island. The interactions at the sociotechnical interface of humans and machines are much more complex. The operators at Three Mile Island were able to understand most of the information, BUT one display was misleading. The display for the pressure relief valve indicated what the valve had been commanded to do, not what it actually did. The operators mistakenly believed that the valve was shut and acted on that information, with dire consequences. As was pointed out in the video, the engineers did not anticipate how operators would react.
Setting your sights on the entire sociotechnical system will help you avoid the trap of finding a single point of blame. It is easy, for instance, to decide that the operators made serious mistakes and to end one's analysis there. This is a short-sighted approach. It would miss the problems with maintenance in the facility (maintenance crews who had gone off duty had blocked the alternative feedwater system to the reactors); it would miss the incomplete reporting requirements for the company; and it would miss the inadequate and misleading testing of the nuclear reactor system.

Human Factors and Medical Device Design:


Many human-machine systems do not work as well as they could because they impose requirements on the human user that are incompatible with the way the user attends, perceives, thinks, remembers, decides, and responds; in other words, with the way he or she processes information. Meanwhile, technological development has become increasingly complex.
Many medical software companies (e.g., SGS, HBOC, and Cerner) are just beginning to embrace human factors in the design of their programs. IBM and other large information systems companies have well-established human factors departments and are involved in medical software projects (something to consider if you are involved in purchasing medical equipment!). Medical devices designed without thought for human factors can lead to patient injuries and even death. Increasingly, medical equipment is being used by laypeople in homes and other alternative settings; the end user may now be the patient rather than the clinician.
For devices to be used safely and effectively, the design of the interaction between user and machine must acknowledge users' capabilities, stress levels, working environments, and training needs. Studies by the Center for Devices and Radiological Health (CDRH) show that many of the causes of serious equipment errors in use are a direct result of poor equipment design and cannot be overcome by improved training of the users. The CDRH studies demonstrate that even small, seemingly unimportant design flaws can result in significant user problems. Potential areas of concern include knobs and dials, device markings, legibility of information, tubing and connectors, and software design. The ability of the manufacturer to develop sophisticated yet easy-to-understand and easy-to-use software is of critical importance. It is not unusual for a device to have control keys with multiple functions, or to require the user to press keys in a specific order. A common example is the use of the "Control," "Alt," and "Delete" keys to reboot a personal computer.
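To make the software-design concern concrete, here is a minimal, hypothetical sketch (not taken from any real device or from the case materials) of a multi-function control key. The same keypress silences an alarm in one mode and confirms a new infusion rate in another; if the current mode is not clearly displayed, the user can confidently issue the wrong command.

```python
# Hypothetical illustration of a mode-dependent control key on a device panel.
# The same "Enter" keypress means different things depending on hidden mode state.

class InfusionPanel:
    """Toy model of a device front panel with a multi-function key."""

    def __init__(self):
        self.mode = "monitor"      # hidden state: "monitor" or "program"
        self.pending_rate = None   # rate entered but not yet confirmed

    def press_mode_key(self):
        # One physical key toggles between the two modes.
        self.mode = "program" if self.mode == "monitor" else "monitor"

    def press_enter(self, value=None):
        # The SAME key confirms a new rate in program mode,
        # but silences an alarm in monitor mode.
        if self.mode == "program":
            self.pending_rate = value
            return f"Rate set to {value} mL/h"
        return "Alarm silenced"


panel = InfusionPanel()
print(panel.press_enter(120))   # user intends to set a rate, but the device is in monitor mode
panel.press_mode_key()
print(panel.press_enter(120))   # same keystroke, entirely different effect
```

A human-factors-oriented design would make the current mode unmistakable on the display, or dedicate separate keys to high-consequence actions, rather than relying on the user to remember hidden state.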

Team Assignment

The Case of "Malfunction 54" with a focus on Patient #5
Ineffective design can lead to significant user error, potential patient harm, and increased healthcare costs, as related in the second case for our Wiki Project.

Instructions:

Using the background information below, review the details of this case. We will focus on the experience of Patient #5, Ray Voyne Fox; read his story first. Then, as a team, do the following:
1. Choose two of the factors that contributed to the resulting patient harm.
2. Classify each error based on the sociotechnical model.
3. Compare the two errors of Malfunction 54 to similar errors at Three Mile Island.
For this case project you DO NOT need to collect any outside resources; everything you need to complete the task is in the information below. You may go beyond this information if you wish, but you are not required to seek additional sources.

Background Information: